Big Tech will spend nearly $700 billion on AI this year. No one knows where the buildout ends
Apr 30, 2026
Welcome to Eye on AI, with AI reporter Sharon Goldman. In this edition: SoftBank plans to list a new AI and robotics company in the US…AI model’s goblin habit, explained…Putting Google’s AI to the test as a trip planner.
If Big Tech’s AI spending spree were a climb up Mount Everest, the companies would still be ascending toward the summit, getting dizzy from the altitude.
In their latest quarterly earnings, Alphabet, Amazon, Meta and Microsoft reported combined capital expenditures of more than $130 billion for the quarter, driven by buildouts of data centers and other infrastructure. That spending could surpass $700 billion this year, up sharply from about $410 billion last year. While only Alphabet has explicitly pointed to further increases beyond this year, all four companies signaled sustained high levels of investment as demand for AI infrastructure continues to grow.
The market reaction has been mixed. Shares of Meta fell sharply after its earnings report as investors focused on the scale of its AI spending plans, and Microsoft also slipped. By contrast, Alphabet and Amazon rose on strong cloud growth—highlighting a growing divide on Wall Street over whether this buildout is justified or getting ahead of itself.
There’s no doubt that AI companies—from the hyperscalers to startups like OpenAI and Anthropic—are hungry, if not starving, for more computing power. The scale of today’s AI systems, which require far more hardware, energy, and coordination than earlier generations of software, means that more is almost never enough. The result is a surge in spending unlike anything the industry has seen before: McKinsey research from last year found that by 2030, AI capex is projected to require $6.7 trillion worldwide to keep pace with the demand for compute power.
Spending big on physical infrastructure
It’s important to understand how much of that spending is going directly into the physical infrastructure that supports AI—both training frontier models and running them. But it can be hard to wrap your mind around the scale of this buildout.
It starts with chips—the specialized silicon semiconductors designed to perform the calculations used in AI. A single GPU from Nvidia, for example, can cost up to $40,000. But companies don’t buy them one at a time; they buy systems. An eight-GPU server can cost hundreds of thousands of dollars, and the clusters needed for hyperscale AI data centers—made up of thousands or even hundreds of thousands of GPUs—can run into the billions.
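The back-of-envelope math behind those figures can be sketched in a few lines of Python. The numbers below are the article’s illustrative upper bounds, not vendor pricing, and they count chips only—networking, power, and buildings would add substantially more.

```python
# Rough cluster-cost arithmetic using the article's illustrative figures.
# These are assumptions for scale, not actual Nvidia or hyperscaler pricing.

GPU_COST = 40_000        # upper-bound cost of a single high-end GPU, per the article
GPUS_PER_SERVER = 8      # a common eight-GPU server configuration

def cluster_cost(num_gpus: int, cost_per_gpu: int = GPU_COST) -> int:
    """Chip cost alone for a cluster; ignores networking, power, and real estate."""
    return num_gpus * cost_per_gpu

# An eight-GPU server lands in the hundreds of thousands of dollars:
server = cluster_cost(GPUS_PER_SERVER)   # 320,000

# A hypothetical 100,000-GPU hyperscale cluster runs into the billions:
hyperscale = cluster_cost(100_000)       # 4,000,000,000
```

Even this crude estimate shows why a single hyperscale data center becomes a multibillion-dollar commitment before a foot of fiber is laid.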
Then there are the data centers that house and power those systems. Pack tens or hundreds of thousands of GPUs into a cluster of buildings spread across hundreds or thousands of acres, and the result starts to look less like a traditional tech investment and more like a utility-scale project—consuming as much electricity as a small city. Last month, I looked closely at Meta’s $27 billion Hyperion data center project in northeast Louisiana, which some estimate will use millions of GPUs.
Another key piece is networking—the cables and switches that connect thousands of chips so they can work together. Training and running modern AI models requires constant, high-speed communication between machines, using specialized switches, fiber optic or ethernet connections, and network cards. Without that, even the most powerful chips can’t do much.
Not everyone agrees spending will keep climbing
Not everyone is convinced the spending will keep climbing. Some investors and analysts see it as a gamble, warning of a potential overbuild in which companies pour money into infrastructure that runs too far ahead of demand. There are still plenty of headlines predicting an AI “reckoning.” And as my colleague Shawn Tully has pointed out, the fast-depreciating nature of AI hardware means that there are even greater costs coming down the pike.
But this AI spending race is now in its third year and still shows no signs of slowing. In 2024, the combined capex of the four biggest hyperscalers was just over $200 billion. Two years later, it’s on track to approach $700 billion.
If this is a climb, there’s still no clear view of the summit.
With that, here’s more AI news.
Sharon [email protected] @sharongoldman
This story was originally featured on Fortune.com